Posted to common-user@hadoop.apache.org by Ian jonhson <jo...@gmail.com> on 2009/04/30 05:19:10 UTC

Load .so library error when Hadoop calls JNI interfaces

Dear all,

I wrote a plugin for Hadoop that calls the interfaces in a
C++-built .so library. The plugin is written in Java, so I
prepared a JNI class to encapsulate the C interfaces.

The Java code compiles and runs successfully when executed
standalone. However, it does not work when embedded in Hadoop.
The exception (found in the Hadoop logs) is:


------------  screen dump  ---------------------

# grep myClass logs/* -r
logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:Exception in
thread "JVM Runner jvm_200904261632_0001_m_-1217897050 spawned."
java.lang.UnsatisfiedLinkError:
org.apache.hadoop.mapred.myClass.myClassfsMount(Ljava/lang/String;)I
logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:      at
org.apache.hadoop.mapred.myClass.myClassfsMount(Native Method)
logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:Exception in
thread "JVM Runner jvm_200904261632_0001_m_-1887898624 spawned."
java.lang.UnsatisfiedLinkError:
org.apache.hadoop.mapred.myClass.myClassfsMount(Ljava/lang/String;)I
logs/hadoop-hadoop-tasktracker-testbed0.container.org.out:      at
org.apache.hadoop.mapred.myClass.myClassfsMount(Native Method)
...

--------------------------------------------------------

It seems the library cannot be loaded inside Hadoop. My code
(myClass.java) looks like this:


---------------  myClass.java  ------------------

import java.io.IOException;
import java.lang.reflect.Field;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public class myClass
{
        public static final Log LOG =
                    LogFactory.getLog("org.apache.hadoop.mapred.myClass");

        public myClass() {
                try {
                        // System.setProperty("java.library.path", "/usr/local/lib");
                        //
                        // The line above does not work, so I have to do
                        // something like the following line instead.
                        addDir("/usr/local/lib");
                        System.loadLibrary("myclass");
                }
                catch (UnsatisfiedLinkError e) {
                        LOG.info("Cannot load library:\n " + e.toString());
                }
                catch (IOException ioe) {
                        LOG.info("IO error:\n " + ioe.toString());
                }
        }

        /* Since System.setProperty() does not work, I have to use this
         * method to force the directory into java.library.path: it rewrites
         * the class loader's private, cached copy of the search path.
         */
        public static void addDir(String s) throws IOException {
                try {
                        // usr_paths holds the directories parsed from
                        // java.library.path at JVM startup.
                        Field field = ClassLoader.class.getDeclaredField("usr_paths");
                        field.setAccessible(true);
                        String[] paths = (String[]) field.get(null);
                        for (int i = 0; i < paths.length; i++) {
                                if (s.equals(paths[i])) {
                                        return;   // already on the search path
                                }
                        }
                        String[] tmp = new String[paths.length + 1];
                        System.arraycopy(paths, 0, tmp, 0, paths.length);
                        tmp[paths.length] = s;
                        field.set(null, tmp);
                } catch (IllegalAccessException e) {
                        throw new IOException("Failed to get permissions to set library path");
                } catch (NoSuchFieldException e) {
                        throw new IOException("Failed to get field handle to set library path");
                }
        }

        public native int myClassfsMount(String subsys);
        public native int myClassfsUmount(String subsys);
}

--------------------------------------------------------
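
As an aside, the reflection hack above is needed because the JVM parses
java.library.path only once, at startup, and under Hadoop each map/reduce
task runs in a separately spawned child JVM (the "JVM Runner ... spawned"
threads in the log), so a path set in one process never reaches the tasks.
Below is a minimal sketch of setting the path for the child JVMs through the
job configuration instead; it assumes the standard mapred.child.java.opts
property, and the class name is hypothetical, not from this thread:

--------------------------------------------------------

import org.apache.hadoop.mapred.JobConf;

public class ChildJvmLibraryPath {

        /* Sets java.library.path for every spawned task JVM. The value of
         * mapred.child.java.opts is passed on the child JVM's command line,
         * which is where -Djava.library.path can actually take effect.
         */
        public static void configure(JobConf conf) {
                // Keep the default heap option when overriding this
                // property, since setting it replaces the default "-Xmx200m".
                conf.set("mapred.child.java.opts",
                         "-Xmx200m -Djava.library.path=/usr/local/lib");
        }
}

--------------------------------------------------------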


I don't know what is missing in my code, and I am wondering whether there
are any rules in Hadoop I should follow to achieve my goal.

FYI, myClassfsMount() and myClassfsUmount() open a socket to request
services from a daemon. I hope this design is not the cause of the failure
in my code.


Any comments?


Thanks in advance,

Ian

Re: Load .so library error when Hadoop calls JNI interfaces

Posted by jason hadoop <ja...@gmail.com>.
I believe it is early to mid June for the paper version.
I think all the mainline chapters are available as alphas right now, and
they are pretty close to the final versions.

On Thu, Apr 30, 2009 at 2:43 AM, Rakhi Khatwani <ra...@gmail.com> wrote:

> Hi Jason,
>             when will the full version of your book be available?
>
> On Thu, Apr 30, 2009 at 8:51 AM, jason hadoop <jason.hadoop@gmail.com>
> wrote:
>
> > You need to make sure that the shared library is available on the
> > tasktracker nodes, either by installing it or by pushing it around via
> > the distributed cache.



-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422

Re: Load .so library error when Hadoop calls JNI interfaces

Posted by Rakhi Khatwani <ra...@gmail.com>.
Hi Jason,
             when will the full version of your book be available?

On Thu, Apr 30, 2009 at 8:51 AM, jason hadoop <ja...@gmail.com> wrote:

> You need to make sure that the shared library is available on the
> tasktracker nodes, either by installing it or by pushing it around via the
> distributed cache.

Re: Load .so library error when Hadoop calls JNI interfaces

Posted by Ian jonhson <jo...@gmail.com>.
2009/4/30 He Yongqiang <he...@software.ict.ac.cn>:
> Put your .so file in every tracker's Hadoop-install/lib/native/Linux-xxx-xx/
>
> Or
>
> In your code, try to do:
>
>   String oldPath = System.getProperty("java.library.path");
>   System.setProperty("java.library.path", oldPath == null ?
>       local_path_of_lib_file : oldPath + File.pathSeparator + local_path_of_lib_file);
>   System.loadLibrary("XXX");
>


I have copied the .so and .a files to Hadoop-install/lib/native/Linux-xxx-xx/
and called System.loadLibrary("XXX"); in my code, but nothing happens.

Then I tried the second solution mentioned above, and the same problem
occurred (the .so files were already in the native directory).



> However, you also need to fetch the library to the local machine, either
> through the DistributedCache (as Jason said) or by putting and getting it
> from HDFS yourself.
>

Do I need to copy the libraries to the local machine, given that I run
Hadoop on a single node?

How can I do that, either by fetching from or putting into HDFS?


> On 09-4-30 5:14 PM, "Ian jonhson" <jo...@gmail.com> wrote:
>
>> You mean that the current Hadoop does not support JNI calls, right?
>> Is there any solution for making the calls to the C interfaces?
>>
>> 2009/4/30 He Yongqiang <he...@software.ict.ac.cn>:
>>> Does Hadoop now support JNI calls in Mappers or Reducers? If yes, how? If
>>> not, I think we should create a JIRA issue for supporting that.
>>>
>>>

Re: Load .so library error when Hadoop calls JNI interfaces

Posted by He Yongqiang <he...@software.ict.ac.cn>.
Put your .so file in every tracker's Hadoop-install/lib/native/Linux-xxx-xx/

Or

In your code, try to do:

  String oldPath = System.getProperty("java.library.path");
  System.setProperty("java.library.path", oldPath == null ?
      local_path_of_lib_file : oldPath + File.pathSeparator + local_path_of_lib_file);
  System.loadLibrary("XXX");

However, you also need to fetch the library to the local machine, either
through the DistributedCache (as Jason said) or by putting and getting it
from HDFS yourself.
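
For the do-it-yourself route, here is a minimal sketch of pulling a library
out of HDFS and loading it by absolute path; the HDFS path, local path, and
class name are hypothetical, not from this thread:

--------------------------------------------------------

import java.io.File;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FetchNativeLib {

        /* Copies a shared library from HDFS to the local disk and loads it
         * by absolute filename, sidestepping java.library.path entirely.
         */
        public static void loadFromHdfs() throws Exception {
                Configuration conf = new Configuration();
                FileSystem fs = FileSystem.get(conf);

                File local = new File("/tmp/libmyclass.so");
                fs.copyToLocalFile(new Path("/libs/libmyclass.so"),
                                   new Path(local.getAbsolutePath()));

                // System.load() takes an absolute filename, so no search
                // path configuration is needed.
                System.load(local.getAbsolutePath());
        }
}

--------------------------------------------------------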

On 09-4-30 5:14 PM, "Ian jonhson" <jo...@gmail.com> wrote:

> You mean that the current Hadoop does not support JNI calls, right?
> Is there any solution for making the calls to the C interfaces?
> 
> 2009/4/30 He Yongqiang <he...@software.ict.ac.cn>:
>> Does Hadoop now support JNI calls in Mappers or Reducers? If yes, how? If
>> not, I think we should create a JIRA issue for supporting that.
>> 
>> 
>> On 09-4-30 4:02 PM, "Ian jonhson" <jo...@gmail.com> wrote:
>> 
>>> Thanks for answering.
>>> 
>>> I run my Hadoop on a single node, not in cluster mode.
>>> 
>>> 
>>> 
>>> On Thu, Apr 30, 2009 at 11:21 AM, jason hadoop <ja...@gmail.com>
>>> wrote:
>>>> You need to make sure that the shared library is available on the
>>>> tasktracker nodes, either by installing it or by pushing it around via the
>>>> distributed cache.



Re: Load .so library error when Hadoop calls JNI interfaces

Posted by Ian jonhson <jo...@gmail.com>.
You mean that the current Hadoop does not support JNI calls, right?
Is there any solution for making the calls to the C interfaces?

2009/4/30 He Yongqiang <he...@software.ict.ac.cn>:
> Does Hadoop now support JNI calls in Mappers or Reducers? If yes, how? If
> not, I think we should create a JIRA issue for supporting that.
>
>
> On 09-4-30 4:02 PM, "Ian jonhson" <jo...@gmail.com> wrote:
>
>> Thanks for answering.
>>
>> I run my Hadoop on a single node, not in cluster mode.
>>
>>
>>
>> On Thu, Apr 30, 2009 at 11:21 AM, jason hadoop <ja...@gmail.com> wrote:
>>> You need to make sure that the shared library is available on the
>>> tasktracker nodes, either by installing it or by pushing it around via the
>>> distributed cache.

Re: Load .so library error when Hadoop calls JNI interfaces

Posted by He Yongqiang <he...@software.ict.ac.cn>.
Does Hadoop now support JNI calls in Mappers or Reducers? If yes, how? If
not, I think we should create a JIRA issue for supporting that.


On 09-4-30 4:02 PM, "Ian jonhson" <jo...@gmail.com> wrote:

> Thanks for answering.
> 
> I run my Hadoop on a single node, not in cluster mode.
> 
> 
> 
> On Thu, Apr 30, 2009 at 11:21 AM, jason hadoop <ja...@gmail.com> wrote:
>> You need to make sure that the shared library is available on the
>> tasktracker nodes, either by installing it or by pushing it around via the
>> distributed cache.



Re: Load .so library error when Hadoop calls JNI interfaces

Posted by Ian jonhson <jo...@gmail.com>.
Thanks for answering.

I run my Hadoop on a single node, not in cluster mode.



On Thu, Apr 30, 2009 at 11:21 AM, jason hadoop <ja...@gmail.com> wrote:
> You need to make sure that the shared library is available on the
> tasktracker nodes, either by installing it or by pushing it around via the
> distributed cache.

Re: Load .so library error when Hadoop calls JNI interfaces

Posted by jason hadoop <ja...@gmail.com>.
You need to make sure that the shared library is available on the
tasktracker nodes, either by installing it or by pushing it around via the
distributed cache.
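
For the distributed-cache route, here is a minimal sketch of shipping the
.so to the task nodes at job submission time; the HDFS path, symlink name,
and class name are hypothetical, not from this thread:

--------------------------------------------------------

import java.net.URI;

import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.mapred.JobConf;

public class ShipNativeLib {

        /* Registers a shared library, already uploaded to HDFS (e.g. with
         * "hadoop fs -put libmyclass.so /libs/"), for distribution to every
         * task node. The "#libmyclass.so" fragment names the symlink that
         * is created in each task's working directory.
         */
        public static void configure(JobConf conf) throws Exception {
                DistributedCache.createSymlink(conf);
                DistributedCache.addCacheFile(
                        new URI("/libs/libmyclass.so#libmyclass.so"), conf);
        }

        // A task can then load the symlinked file by absolute path:
        //   System.load(new java.io.File("libmyclass.so").getAbsolutePath());
}

--------------------------------------------------------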






-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422