Posted to hdfs-dev@hadoop.apache.org by Aastha Mehta <aa...@gmail.com> on 2011/08/01 06:04:42 UTC

Compiling hadoop native libraries

Hello,

I am trying to run fuse_dfs_wrapper.sh from
hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
following error:
./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open
shared object file: No such file or directory

I searched on the net and found a response to a similar query here -
https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1

My hadoop package contains the native files in
hadoop-0.20.2/lib/native/Linux-amd64-64/

I followed this link -
http://hadoop.apache.org/common/docs/current/native_libraries.html to
understand the steps to build hadoop native libraries.

I have a small query regarding the building step. On the above link, it is
mentioned -

"Once you installed the prerequisite packages use the standard hadoop
build.xml file and pass along the compile.native flag (set to true) to build
the native hadoop library:

$ ant -Dcompile.native=true <target>

You should see the newly-built library in:

$ build/native/<platform>/lib

where <platform> is a combination of the system-properties: ${os.name}-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
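For what it's worth, the platform string described in that passage can be reconstructed from `uname`. The following is only a rough sketch; the arch/model mapping is an assumption covering common cases, not something the docs specify:

```shell
#!/bin/sh
# Sketch: derive the <platform> directory name the Hadoop docs describe,
# ${os.name}-${os.arch}-${sun.arch.data.model}, from uname output.
# The case mapping below is an assumption for common architectures.
os_name=$(uname -s)          # e.g. Linux
machine=$(uname -m)          # e.g. x86_64
case "$machine" in
  x86_64) arch=amd64; model=64 ;;
  i?86)   arch=i386;  model=32 ;;
  *)      arch=$machine; model=64 ;;  # assumed default for other arches
esac
platform="${os_name}-${arch}-${model}"
echo "expect the library under: build/native/${platform}/lib"
```

On a 64-bit Linux machine this yields Linux-amd64-64, which matches the lib/native/ directory name shipped in the 0.20.2 tarball.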


Could someone please tell me what exactly <target> is in the first step?


Thanks and regards,

Aastha.





-- 
Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasthakm@gmail.com

Re: Compiling hadoop native libraries

Posted by Aastha Mehta <aa...@gmail.com>.
I built libhdfs again. I still get the error. Please help me.
Thanks for your time.

Regards,
Aastha.

On 1 August 2011 12:02, Aastha Mehta <aa...@gmail.com> wrote:

> I did run this command earlier. Does it need to be run after compiling the
> native library?
>
> Thanks,
> Aastha.
>
>
> On 1 August 2011 12:01, Eli Collins <el...@cloudera.com> wrote:
>
>> You haven't built libhdfs. You can do that with ant
>> compile-c++-libhdfs -Dcompile.c++=true
>>
>> On Sun, Jul 31, 2011 at 10:26 PM, Aastha Mehta <aa...@gmail.com>
>> wrote:
>> > The command works correctly. But I still get the error for running the
>> > fuse_dfs_wrapper.sh script:
>> >
>> > ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
>> open
>> > shared object file: No such file or directory
>> >
>> > Aastha.
>> >
>> > On 1 August 2011 10:03, Arun C Murthy <ac...@hortonworks.com> wrote:
>> >
>> >> Run the following command:
>> >>
>> >> $ ant -Dcompile.native=true package
>> >>
>> >> Arun
>> >>
>> >> On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:
>> >>
>> >> > Hi Arun,
>> >> >
>> >> > Thanks for the prompt reply. I am not sure, I understood you
>> correctly.
>> >> > Compile/binary/tar of what? The native files? The
>> >> lib/native/Linux-amd64-64/
>> >> > contains following files:
>> >> > libhadoop.a
>> >> > libhadoop.la
>> >> > libhadoop.so
>> >> > libhadoop.so.1
>> >> > libhadoop.so.1.0.0
>> >> >
>> >> > This directory is present in the package itself. So, should I make a
>> tar
>> >> of
>> >> > it and then provide it? I tried the following, but it failed:
>> >> > ant -Dcompile.native=true
>> >> > $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
>> >> >
>> >> > The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so
>> does
>> >> > not exist in the project Hadoop".
>> >> >
>> >> > Thanks,
>> >> > Aastha.
>> >> >
>> >> > On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:
>> >> >
>> >> >> <target> could be compile or binary or tar.
>> >> >>
>> >> >> Arun
>> >> >>
>> >> >> Sent from my iPhone
>> >> >>
>> >> >> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com>
>> wrote:
>> >> >>
>> >> >>> Hello,
>> >> >>>
>> >> >>> I am trying to run fuse_dfs_wrapper.sh from
>> >> >>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get
>> the
>> >> >>> following error:
>> >> >>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0:
>> cannot
>> >> >> open
>> >> >>> shared object file: No such file or directory
>> >> >>>
>> >> >>> I searched on the net and found a response to a similar query here
>> -
>> >> >>>
>> >> >>
>> >>
>> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
>> >> >>>
>> >> >>> My hadoop package contains the native files in
>> >> >>> hadoop-0.20.2/lib/native/Linux-amd64-64/
>> >> >>>
>> >> >>> I followed to this link -
>> >> >>> http://hadoop.apache.org/common/docs/current/native_libraries.html to
>> >> >>> understand the steps to build hadoop native libraries.
>> >> >>>
>> >> >>> I have a small query regarding the building step. On the above
>> link, it
>> >> >> is
>> >> >>> mentioned -
>> >> >>>
>> >> >>> "Once you installed the prerequisite packages use the standard
>> hadoop
>> >> >>> build.xml file and pass along the compile.native flag (set to true)
>> to
>> >> >> build
>> >> >>> the native hadoop library:
>> >> >>>
>> >> >>> $ ant -Dcompile.native=true <target>
>> >> >>>
>> >> >>> You should see the newly-built library in:
>> >> >>>
>> >> >>> $ build/native/<platform>/lib
>> >> >>>
>> >> >>> where <platform> is a combination of the system-properties: ${
>> os.name
>> >> >>> }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
>> >> >>>
>> >> >>>
>> >> >>> Could someone please tell what exactly is <target> in the first
>> step.
>> >> >>>
>> >> >>>
>> >> >>> Thanks and regards,
>> >> >>>
>> >> >>> Aastha.
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>> --
>> >> >>> Aastha Mehta
>> >> >>> B.E. (Hons.) Computer Science
>> >> >>> BITS Pilani
>> >> >>> E-mail: aasthakm@gmail.com
>> >> >>
>> >> >
>> >> >
>> >> >
>> >> > --
>> >> > Aastha Mehta
>> >> > B.E. (Hons.) Computer Science
>> >> > BITS Pilani
>> >> > E-mail: aasthakm@gmail.com
>> >>
>> >>
>> >
>> >
>> > --
>> > Aastha Mehta
>> > B.E. (Hons.) Computer Science
>> > BITS Pilani
>> > E-mail: aasthakm@gmail.com
>> >
>>
>
>
>
> --
> Aastha Mehta
> B.E. (Hons.) Computer Science
> BITS Pilani
> E-mail: aasthakm@gmail.com
>
>
>


-- 
Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasthakm@gmail.com

Re: Compiling hadoop native libraries

Posted by Aastha Mehta <aa...@gmail.com>.
My bad! LD_LIBRARY_PATH was wrong in fuse_dfs_wrapper.sh. It is working
now.
Thanks a lot for the help again.

Regards,
Aastha.

On 1 August 2011 12:08, Eli Collins <el...@cloudera.com> wrote:

> If libhdfs.so is in your build directory then it sounds like it is not
> in your LD_LIBRARY_PATH. Double check fuse_dfs_wrapper.sh to make sure
> it's setting it correctly.
>
> Btw here's how I call the wrapper..
> https://github.com/elicollins/hadoop-dev/blob/master/bin/fuse-mount-hdfs
>
> On Sun, Jul 31, 2011 at 11:32 PM, Aastha Mehta <aa...@gmail.com> wrote:
> > I did run this command earlier. Does it need to be run after compiling
> the
> > native library?
> >
> > Thanks,
> > Aastha.
> >
> > On 1 August 2011 12:01, Eli Collins <el...@cloudera.com> wrote:
> >
> >> You haven't built libhdfs. You can do that with ant
> >> compile-c++-libhdfs -Dcompile.c++=true
> >>
> >> On Sun, Jul 31, 2011 at 10:26 PM, Aastha Mehta <aa...@gmail.com>
> wrote:
> >> > The command works correctly. But I still get the error for running the
> >> > fuse_dfs_wrapper.sh script:
> >> >
> >> > ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
> >> open
> >> > shared object file: No such file or directory
> >> >
> >> > Aastha.
> >> >
> >> > On 1 August 2011 10:03, Arun C Murthy <ac...@hortonworks.com> wrote:
> >> >
> >> >> Run the following command:
> >> >>
> >> >> $ ant -Dcompile.native=true package
> >> >>
> >> >> Arun
> >> >>
> >> >> On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:
> >> >>
> >> >> > Hi Arun,
> >> >> >
> >> >> > Thanks for the prompt reply. I am not sure, I understood you
> >> correctly.
> >> >> > Compile/binary/tar of what? The native files? The
> >> >> lib/native/Linux-amd64-64/
> >> >> > contains following files:
> >> >> > libhadoop.a
> >> >> > libhadoop.la
> >> >> > libhadoop.so
> >> >> > libhadoop.so.1
> >> >> > libhadoop.so.1.0.0
> >> >> >
> >> >> > This directory is present in the package itself. So, should I make
> a
> >> tar
> >> >> of
> >> >> > it and then provide it? I tried the following, but it failed:
> >> >> > ant -Dcompile.native=true
> >> >> > $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
> >> >> >
> >> >> > The error I got is - "Target
>  lib/native/Linux-amd64-64/libhadoop.so
> >> does
> >> >> > not exist in the project Hadoop".
> >> >> >
> >> >> > Thanks,
> >> >> > Aastha.
> >> >> >
> >> >> > On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:
> >> >> >
> >> >> >> <target> could be compile or binary or tar.
> >> >> >>
> >> >> >> Arun
> >> >> >>
> >> >> >> Sent from my iPhone
> >> >> >>
> >> >> >> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com>
> >> wrote:
> >> >> >>
> >> >> >>> Hello,
> >> >> >>>
> >> >> >>> I am trying to run fuse_dfs_wrapper.sh from
> >> >> >>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get
> >> the
> >> >> >>> following error:
> >> >> >>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0:
> >> cannot
> >> >> >> open
> >> >> >>> shared object file: No such file or directory
> >> >> >>>
> >> >> >>> I searched on the net and found a response to a similar query
> here -
> >> >> >>>
> >> >> >>
> >> >>
> >>
> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
> >> >> >>>
> >> >> >>> My hadoop package contains the native files in
> >> >> >>> hadoop-0.20.2/lib/native/Linux-amd64-64/
> >> >> >>>
> >> >> >>> I followed to this link -
> >> >> >>>
> http://hadoop.apache.org/common/docs/current/native_libraries.html to
> >> >> >>> understand the steps to build hadoop native libraries.
> >> >> >>>
> >> >> >>> I have a small query regarding the building step. On the above
> link,
> >> it
> >> >> >> is
> >> >> >>> mentioned -
> >> >> >>>
> >> >> >>> "Once you installed the prerequisite packages use the standard
> >> hadoop
> >> >> >>> build.xml file and pass along the compile.native flag (set to
> true)
> >> to
> >> >> >> build
> >> >> >>> the native hadoop library:
> >> >> >>>
> >> >> >>> $ ant -Dcompile.native=true <target>
> >> >> >>>
> >> >> >>> You should see the newly-built library in:
> >> >> >>>
> >> >> >>> $ build/native/<platform>/lib
> >> >> >>>
> >> >> >>> where <platform> is a combination of the system-properties: ${
> >> os.name
> >> >> >>> }-${os.arch}-${sun.arch.data.model} (for example,
> Linux-i386-32)."
> >> >> >>>
> >> >> >>>
> >> >> >>> Could someone please tell what exactly is <target> in the first
> >> step.
> >> >> >>>
> >> >> >>>
> >> >> >>> Thanks and regards,
> >> >> >>>
> >> >> >>> Aastha.
> >> >> >>>
> >> >> >>>
> >> >> >>>
> >> >> >>>
> >> >> >>>
> >> >> >>> --
> >> >> >>> Aastha Mehta
> >> >> >>> B.E. (Hons.) Computer Science
> >> >> >>> BITS Pilani
> >> >> >>> E-mail: aasthakm@gmail.com
> >> >> >>
> >> >> >
> >> >> >
> >> >> >
> >> >> > --
> >> >> > Aastha Mehta
> >> >> > B.E. (Hons.) Computer Science
> >> >> > BITS Pilani
> >> >> > E-mail: aasthakm@gmail.com
> >> >>
> >> >>
> >> >
> >> >
> >> > --
> >> > Aastha Mehta
> >> > B.E. (Hons.) Computer Science
> >> > BITS Pilani
> >> > E-mail: aasthakm@gmail.com
> >> >
> >>
> >
> >
> >
> > --
> > Aastha Mehta
> > B.E. (Hons.) Computer Science
> > BITS Pilani
> > E-mail: aasthakm@gmail.com
> >
>



-- 
Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasthakm@gmail.com

Re: Compiling hadoop native libraries

Posted by Eli Collins <el...@cloudera.com>.
If libhdfs.so is in your build directory then it sounds like it is not
in your LD_LIBRARY_PATH. Double check fuse_dfs_wrapper.sh to make sure
it's setting it correctly.

Btw here's how I call the wrapper..
https://github.com/elicollins/hadoop-dev/blob/master/bin/fuse-mount-hdfs
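A minimal wrapper along those lines might look like the sketch below. Every path here is an assumption for a hadoop-0.20.2 layout (the HADOOP_HOME/JAVA_HOME defaults and the libhdfs build directory are placeholders), so substitute your own:

```shell
#!/bin/sh
# Hypothetical fuse_dfs_wrapper.sh excerpt; all paths are assumptions.
HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop-0.20.2}
JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java}
# The loader can only resolve libhdfs.so.0 if its directory is on
# LD_LIBRARY_PATH; libjvm.so is needed too, since libhdfs links the JVM.
export LD_LIBRARY_PATH="$HADOOP_HOME/build/c++/Linux-amd64-64/lib:$JAVA_HOME/jre/lib/amd64/server:$LD_LIBRARY_PATH"
# Launch the binary only if it is actually present next to this script.
if [ -x ./fuse_dfs ]; then
  exec ./fuse_dfs "$@"
fi
echo "fuse_dfs not found here; LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
```

The key point is only that the directory actually containing libhdfs.so.0 ends up on LD_LIBRARY_PATH before fuse_dfs starts.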

On Sun, Jul 31, 2011 at 11:32 PM, Aastha Mehta <aa...@gmail.com> wrote:
> I did run this command earlier. Does it need to be run after compiling the
> native library?
>
> Thanks,
> Aastha.
>
> On 1 August 2011 12:01, Eli Collins <el...@cloudera.com> wrote:
>
>> You haven't built libhdfs. You can do that with ant
>> compile-c++-libhdfs -Dcompile.c++=true
>>
>> On Sun, Jul 31, 2011 at 10:26 PM, Aastha Mehta <aa...@gmail.com> wrote:
>> > The command works correctly. But I still get the error for running the
>> > fuse_dfs_wrapper.sh script:
>> >
>> > ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
>> open
>> > shared object file: No such file or directory
>> >
>> > Aastha.
>> >
>> > On 1 August 2011 10:03, Arun C Murthy <ac...@hortonworks.com> wrote:
>> >
>> >> Run the following command:
>> >>
>> >> $ ant -Dcompile.native=true package
>> >>
>> >> Arun
>> >>
>> >> On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:
>> >>
>> >> > Hi Arun,
>> >> >
>> >> > Thanks for the prompt reply. I am not sure, I understood you
>> correctly.
>> >> > Compile/binary/tar of what? The native files? The
>> >> lib/native/Linux-amd64-64/
>> >> > contains following files:
>> >> > libhadoop.a
>> >> > libhadoop.la
>> >> > libhadoop.so
>> >> > libhadoop.so.1
>> >> > libhadoop.so.1.0.0
>> >> >
>> >> > This directory is present in the package itself. So, should I make a
>> tar
>> >> of
>> >> > it and then provide it? I tried the following, but it failed:
>> >> > ant -Dcompile.native=true
>> >> > $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
>> >> >
>> >> > The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so
>> does
>> >> > not exist in the project Hadoop".
>> >> >
>> >> > Thanks,
>> >> > Aastha.
>> >> >
>> >> > On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:
>> >> >
>> >> >> <target> could be compile or binary or tar.
>> >> >>
>> >> >> Arun
>> >> >>
>> >> >> Sent from my iPhone
>> >> >>
>> >> >> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com>
>> wrote:
>> >> >>
>> >> >>> Hello,
>> >> >>>
>> >> >>> I am trying to run fuse_dfs_wrapper.sh from
>> >> >>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get
>> the
>> >> >>> following error:
>> >> >>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0:
>> cannot
>> >> >> open
>> >> >>> shared object file: No such file or directory
>> >> >>>
>> >> >>> I searched on the net and found a response to a similar query here -
>> >> >>>
>> >> >>
>> >>
>> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
>> >> >>>
>> >> >>> My hadoop package contains the native files in
>> >> >>> hadoop-0.20.2/lib/native/Linux-amd64-64/
>> >> >>>
>> >> >>> I followed to this link -
>> >> >>> http://hadoop.apache.org/common/docs/current/native_libraries.html to
>> >> >>> understand the steps to build hadoop native libraries.
>> >> >>>
>> >> >>> I have a small query regarding the building step. On the above link,
>> it
>> >> >> is
>> >> >>> mentioned -
>> >> >>>
>> >> >>> "Once you installed the prerequisite packages use the standard
>> hadoop
>> >> >>> build.xml file and pass along the compile.native flag (set to true)
>> to
>> >> >> build
>> >> >>> the native hadoop library:
>> >> >>>
>> >> >>> $ ant -Dcompile.native=true <target>
>> >> >>>
>> >> >>> You should see the newly-built library in:
>> >> >>>
>> >> >>> $ build/native/<platform>/lib
>> >> >>>
>> >> >>> where <platform> is a combination of the system-properties: ${
>> os.name
>> >> >>> }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
>> >> >>>
>> >> >>>
>> >> >>> Could someone please tell what exactly is <target> in the first
>> step.
>> >> >>>
>> >> >>>
>> >> >>> Thanks and regards,
>> >> >>>
>> >> >>> Aastha.
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>> --
>> >> >>> Aastha Mehta
>> >> >>> B.E. (Hons.) Computer Science
>> >> >>> BITS Pilani
>> >> >>> E-mail: aasthakm@gmail.com
>> >> >>
>> >> >
>> >> >
>> >> >
>> >> > --
>> >> > Aastha Mehta
>> >> > B.E. (Hons.) Computer Science
>> >> > BITS Pilani
>> >> > E-mail: aasthakm@gmail.com
>> >>
>> >>
>> >
>> >
>> > --
>> > Aastha Mehta
>> > B.E. (Hons.) Computer Science
>> > BITS Pilani
>> > E-mail: aasthakm@gmail.com
>> >
>>
>
>
>
> --
> Aastha Mehta
> B.E. (Hons.) Computer Science
> BITS Pilani
> E-mail: aasthakm@gmail.com
>

Re: Compiling hadoop native libraries

Posted by Aastha Mehta <aa...@gmail.com>.
I did run this command earlier. Does it need to be run after compiling the
native library?

Thanks,
Aastha.

On 1 August 2011 12:01, Eli Collins <el...@cloudera.com> wrote:

> You haven't built libhdfs. You can do that with ant
> compile-c++-libhdfs -Dcompile.c++=true
>
> On Sun, Jul 31, 2011 at 10:26 PM, Aastha Mehta <aa...@gmail.com> wrote:
> > The command works correctly. But I still get the error for running the
> > fuse_dfs_wrapper.sh script:
> >
> > ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
> open
> > shared object file: No such file or directory
> >
> > Aastha.
> >
> > On 1 August 2011 10:03, Arun C Murthy <ac...@hortonworks.com> wrote:
> >
> >> Run the following command:
> >>
> >> $ ant -Dcompile.native=true package
> >>
> >> Arun
> >>
> >> On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:
> >>
> >> > Hi Arun,
> >> >
> >> > Thanks for the prompt reply. I am not sure, I understood you
> correctly.
> >> > Compile/binary/tar of what? The native files? The
> >> lib/native/Linux-amd64-64/
> >> > contains following files:
> >> > libhadoop.a
> >> > libhadoop.la
> >> > libhadoop.so
> >> > libhadoop.so.1
> >> > libhadoop.so.1.0.0
> >> >
> >> > This directory is present in the package itself. So, should I make a
> tar
> >> of
> >> > it and then provide it? I tried the following, but it failed:
> >> > ant -Dcompile.native=true
> >> > $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
> >> >
> >> > The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so
> does
> >> > not exist in the project Hadoop".
> >> >
> >> > Thanks,
> >> > Aastha.
> >> >
> >> > On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:
> >> >
> >> >> <target> could be compile or binary or tar.
> >> >>
> >> >> Arun
> >> >>
> >> >> Sent from my iPhone
> >> >>
> >> >> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com>
> wrote:
> >> >>
> >> >>> Hello,
> >> >>>
> >> >>> I am trying to run fuse_dfs_wrapper.sh from
> >> >>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get
> the
> >> >>> following error:
> >> >>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0:
> cannot
> >> >> open
> >> >>> shared object file: No such file or directory
> >> >>>
> >> >>> I searched on the net and found a response to a similar query here -
> >> >>>
> >> >>
> >>
> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
> >> >>>
> >> >>> My hadoop package contains the native files in
> >> >>> hadoop-0.20.2/lib/native/Linux-amd64-64/
> >> >>>
> >> >>> I followed to this link -
> >> >>> http://hadoop.apache.org/common/docs/current/native_libraries.html to
> >> >>> understand the steps to build hadoop native libraries.
> >> >>>
> >> >>> I have a small query regarding the building step. On the above link,
> it
> >> >> is
> >> >>> mentioned -
> >> >>>
> >> >>> "Once you installed the prerequisite packages use the standard
> hadoop
> >> >>> build.xml file and pass along the compile.native flag (set to true)
> to
> >> >> build
> >> >>> the native hadoop library:
> >> >>>
> >> >>> $ ant -Dcompile.native=true <target>
> >> >>>
> >> >>> You should see the newly-built library in:
> >> >>>
> >> >>> $ build/native/<platform>/lib
> >> >>>
> >> >>> where <platform> is a combination of the system-properties: ${
> os.name
> >> >>> }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
> >> >>>
> >> >>>
> >> >>> Could someone please tell what exactly is <target> in the first
> step.
> >> >>>
> >> >>>
> >> >>> Thanks and regards,
> >> >>>
> >> >>> Aastha.
> >> >>>
> >> >>>
> >> >>>
> >> >>>
> >> >>>
> >> >>> --
> >> >>> Aastha Mehta
> >> >>> B.E. (Hons.) Computer Science
> >> >>> BITS Pilani
> >> >>> E-mail: aasthakm@gmail.com
> >> >>
> >> >
> >> >
> >> >
> >> > --
> >> > Aastha Mehta
> >> > B.E. (Hons.) Computer Science
> >> > BITS Pilani
> >> > E-mail: aasthakm@gmail.com
> >>
> >>
> >
> >
> > --
> > Aastha Mehta
> > B.E. (Hons.) Computer Science
> > BITS Pilani
> > E-mail: aasthakm@gmail.com
> >
>



-- 
Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasthakm@gmail.com

Re: Compiling hadoop native libraries

Posted by Eli Collins <el...@cloudera.com>.
You haven't built libhdfs. You can do that with ant
compile-c++-libhdfs -Dcompile.c++=true
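A quick way to check whether that build actually produced the library is to look for libhdfs.so* in the build tree. This is just a sketch; the HADOOP_HOME default and the build-tree location are assumptions for 0.20.2:

```shell
#!/bin/sh
# Sketch: after "ant compile-c++-libhdfs -Dcompile.c++=true", libhdfs
# should appear somewhere under the build tree. Paths are assumptions.
HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop-0.20.2}
found=$(find "$HADOOP_HOME/build" -name 'libhdfs.so*' 2>/dev/null | head -n 1)
if [ -n "$found" ]; then
  echo "libhdfs built: $found"
else
  echo "libhdfs missing; run: ant compile-c++-libhdfs -Dcompile.c++=true"
fi
```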

On Sun, Jul 31, 2011 at 10:26 PM, Aastha Mehta <aa...@gmail.com> wrote:
> The command works correctly. But I still get the error for running the
> fuse_dfs_wrapper.sh script:
>
> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open
> shared object file: No such file or directory
>
> Aastha.
>
> On 1 August 2011 10:03, Arun C Murthy <ac...@hortonworks.com> wrote:
>
>> Run the following command:
>>
>> $ ant -Dcompile.native=true package
>>
>> Arun
>>
>> On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:
>>
>> > Hi Arun,
>> >
>> > Thanks for the prompt reply. I am not sure, I understood you correctly.
>> > Compile/binary/tar of what? The native files? The
>> lib/native/Linux-amd64-64/
>> > contains following files:
>> > libhadoop.a
>> > libhadoop.la
>> > libhadoop.so
>> > libhadoop.so.1
>> > libhadoop.so.1.0.0
>> >
>> > This directory is present in the package itself. So, should I make a tar
>> of
>> > it and then provide it? I tried the following, but it failed:
>> > ant -Dcompile.native=true
>> > $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
>> >
>> > The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so does
>> > not exist in the project Hadoop".
>> >
>> > Thanks,
>> > Aastha.
>> >
>> > On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:
>> >
>> >> <target> could be compile or binary or tar.
>> >>
>> >> Arun
>> >>
>> >> Sent from my iPhone
>> >>
>> >> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com> wrote:
>> >>
>> >>> Hello,
>> >>>
>> >>> I am trying to run fuse_dfs_wrapper.sh from
>> >>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
>> >>> following error:
>> >>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
>> >> open
>> >>> shared object file: No such file or directory
>> >>>
>> >>> I searched on the net and found a response to a similar query here -
>> >>>
>> >>
>> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
>> >>>
>> >>> My hadoop package contains the native files in
>> >>> hadoop-0.20.2/lib/native/Linux-amd64-64/
>> >>>
>> >>> I followed to this link -
>> >>> http://hadoop.apache.org/common/docs/current/native_libraries.html to
>> >>> understand the steps to build hadoop native libraries.
>> >>>
>> >>> I have a small query regarding the building step. On the above link, it
>> >> is
>> >>> mentioned -
>> >>>
>> >>> "Once you installed the prerequisite packages use the standard hadoop
>> >>> build.xml file and pass along the compile.native flag (set to true) to
>> >> build
>> >>> the native hadoop library:
>> >>>
>> >>> $ ant -Dcompile.native=true <target>
>> >>>
>> >>> You should see the newly-built library in:
>> >>>
>> >>> $ build/native/<platform>/lib
>> >>>
>> >>> where <platform> is a combination of the system-properties: ${os.name
>> >>> }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
>> >>>
>> >>>
>> >>> Could someone please tell what exactly is <target> in the first step.
>> >>>
>> >>>
>> >>> Thanks and regards,
>> >>>
>> >>> Aastha.
>> >>>
>> >>>
>> >>>
>> >>>
>> >>>
>> >>> --
>> >>> Aastha Mehta
>> >>> B.E. (Hons.) Computer Science
>> >>> BITS Pilani
>> >>> E-mail: aasthakm@gmail.com
>> >>
>> >
>> >
>> >
>> > --
>> > Aastha Mehta
>> > B.E. (Hons.) Computer Science
>> > BITS Pilani
>> > E-mail: aasthakm@gmail.com
>>
>>
>
>
> --
> Aastha Mehta
> B.E. (Hons.) Computer Science
> BITS Pilani
> E-mail: aasthakm@gmail.com
>

Re: Compiling hadoop native libraries

Posted by Aastha Mehta <aa...@gmail.com>.
The command works correctly, but I still get the error when running the
fuse_dfs_wrapper.sh script:

./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open
shared object file: No such file or directory

Aastha.
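When a loader error like this persists, ldd shows which dependency is unresolved. A sketch (the ./fuse_dfs path and the FUSE_DFS_BIN variable are assumptions for illustration):

```shell
#!/bin/sh
# Sketch: a line like "libhdfs.so.0 => not found" in ldd output pinpoints
# exactly which shared library the dynamic loader cannot locate.
bin=${FUSE_DFS_BIN:-./fuse_dfs}
if [ -e "$bin" ]; then
  ldd "$bin" | grep hdfs || echo "no libhdfs dependency listed by ldd"
else
  echo "example only: run 'ldd ./fuse_dfs' in the directory holding the binary"
fi
```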

On 1 August 2011 10:03, Arun C Murthy <ac...@hortonworks.com> wrote:

> Run the following command:
>
> $ ant -Dcompile.native=true package
>
> Arun
>
> On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:
>
> > Hi Arun,
> >
> > Thanks for the prompt reply. I am not sure, I understood you correctly.
> > Compile/binary/tar of what? The native files? The
> lib/native/Linux-amd64-64/
> > contains following files:
> > libhadoop.a
> > libhadoop.la
> > libhadoop.so
> > libhadoop.so.1
> > libhadoop.so.1.0.0
> >
> > This directory is present in the package itself. So, should I make a tar
> of
> > it and then provide it? I tried the following, but it failed:
> > ant -Dcompile.native=true
> > $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
> >
> > The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so does
> > not exist in the project Hadoop".
> >
> > Thanks,
> > Aastha.
> >
> > On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:
> >
> >> <target> could be compile or binary or tar.
> >>
> >> Arun
> >>
> >> Sent from my iPhone
> >>
> >> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com> wrote:
> >>
> >>> Hello,
> >>>
> >>> I am trying to run fuse_dfs_wrapper.sh from
> >>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
> >>> following error:
> >>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
> >> open
> >>> shared object file: No such file or directory
> >>>
> >>> I searched on the net and found a response to a similar query here -
> >>>
> >>
> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
> >>>
> >>> My hadoop package contains the native files in
> >>> hadoop-0.20.2/lib/native/Linux-amd64-64/
> >>>
> >>> I followed to this link -
> >>> http://hadoop.apache.org/common/docs/current/native_libraries.html to
> >>> understand the steps to build hadoop native libraries.
> >>>
> >>> I have a small query regarding the building step. On the above link, it
> >> is
> >>> mentioned -
> >>>
> >>> "Once you installed the prerequisite packages use the standard hadoop
> >>> build.xml file and pass along the compile.native flag (set to true) to
> >> build
> >>> the native hadoop library:
> >>>
> >>> $ ant -Dcompile.native=true <target>
> >>>
> >>> You should see the newly-built library in:
> >>>
> >>> $ build/native/<platform>/lib
> >>>
> >>> where <platform> is a combination of the system-properties: ${os.name
> >>> }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
> >>>
> >>>
> >>> Could someone please tell what exactly is <target> in the first step.
> >>>
> >>>
> >>> Thanks and regards,
> >>>
> >>> Aastha.
> >>>
> >>>
> >>>
> >>>
> >>>
> >>> --
> >>> Aastha Mehta
> >>> B.E. (Hons.) Computer Science
> >>> BITS Pilani
> >>> E-mail: aasthakm@gmail.com
> >>
> >
> >
> >
> > --
> > Aastha Mehta
> > B.E. (Hons.) Computer Science
> > BITS Pilani
> > E-mail: aasthakm@gmail.com
>
>


-- 
Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasthakm@gmail.com

Re: Compiling hadoop native libraries

Posted by Arun C Murthy <ac...@hortonworks.com>.
Run the following command:

$ ant -Dcompile.native=true package

Arun

On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:

> Hi Arun,
> 
> Thanks for the prompt reply. I am not sure, I understood you correctly.
> Compile/binary/tar of what? The native files? The lib/native/Linux-amd64-64/
> contains following files:
> libhadoop.a
> libhadoop.la
> libhadoop.so
> libhadoop.so.1
> libhadoop.so.1.0.0
> 
> This directory is present in the package itself. So, should I make a tar of
> it and then provide it? I tried the following, but it failed:
> ant -Dcompile.native=true
> $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
> 
> The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so does
> not exist in the project Hadoop".
> 
> Thanks,
> Aastha.
> 
> On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:
> 
>> <target> could be compile or binary or tar.
>> 
>> Arun
>> 
>> Sent from my iPhone
>> 
>> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com> wrote:
>> 
>>> Hello,
>>> 
>>> I am trying to run fuse_dfs_wrapper.sh from
>>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
>>> following error:
>>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
>> open
>>> shared object file: No such file or directory
>>> 
>>> I searched on the net and found a response to a similar query here -
>>> 
>> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
>>> 
>>> My hadoop package contains the native files in
>>> hadoop-0.20.2/lib/native/Linux-amd64-64/
>>> 
>>> I followed to this link -
>>> http://hadoop.apache.org/common/docs/current/native_libraries.html to
>>> understand the steps to build hadoop native libraries.
>>> 
>>> I have a small query regarding the building step. On the above link, it
>> is
>>> mentioned -
>>> 
>>> "Once you installed the prerequisite packages use the standard hadoop
>>> build.xml file and pass along the compile.native flag (set to true) to
>> build
>>> the native hadoop library:
>>> 
>>> $ ant -Dcompile.native=true <target>
>>> 
>>> You should see the newly-built library in:
>>> 
>>> $ build/native/<platform>/lib
>>> 
>>> where <platform> is a combination of the system-properties: ${os.name
>>> }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
>>> 
>>> 
>>> Could someone please tell what exactly is <target> in the first step.
>>> 
>>> 
>>> Thanks and regards,
>>> 
>>> Aastha.
>>> 
>>> 
>>> 
>>> 
>>> 
>>> --
>>> Aastha Mehta
>>> B.E. (Hons.) Computer Science
>>> BITS Pilani
>>> E-mail: aasthakm@gmail.com
>> 
> 
> 
> 
> -- 
> Aastha Mehta
> B.E. (Hons.) Computer Science
> BITS Pilani
> E-mail: aasthakm@gmail.com


Re: Compiling hadoop native libraries

Posted by Aastha Mehta <aa...@gmail.com>.
Hi Arun,

Thanks for the prompt reply. I am not sure I understood you correctly.
Compile/binary/tar of what? The native files? The lib/native/Linux-amd64-64/
contains the following files:
libhadoop.a
libhadoop.la
libhadoop.so
libhadoop.so.1
libhadoop.so.1.0.0

This directory is present in the package itself. So, should I make a tar of
it and then provide it? I tried the following, but it failed:
ant -Dcompile.native=true
$HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so

The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so does
not exist in the project Hadoop".

Thanks,
Aastha.

On 1 August 2011 09:44, Arun Murthy <ac...@hortonworks.com> wrote:

> <target> could be compile or binary or tar.
>
> Arun
>
> Sent from my iPhone
>
> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com> wrote:
>
> > Hello,
> >
> > I am trying to run fuse_dfs_wrapper.sh from
> > hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
> > following error:
> > ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
> open
> > shared object file: No such file or directory
> >
> > I searched on the net and found a response to a similar query here -
> >
> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
> >
> > My hadoop package contains the native files in
> > hadoop-0.20.2/lib/native/Linux-amd64-64/
> >
> > I followed to this link -
> > http://hadoop.apache.org/common/docs/current/native_libraries.html to
> > understand the steps to build hadoop native libraries.
> >
> > I have a small query regarding the building step. On the above link, it
> is
> > mentioned -
> >
> > "Once you installed the prerequisite packages use the standard hadoop
> > build.xml file and pass along the compile.native flag (set to true) to
> build
> > the native hadoop library:
> >
> > $ ant -Dcompile.native=true <target>
> >
> > You should see the newly-built library in:
> >
> > $ build/native/<platform>/lib
> >
> > where <platform> is a combination of the system-properties: ${os.name
> > }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
> >
> >
> > Could someone please tell what exactly is <target> in the first step.
> >
> >
> > Thanks and regards,
> >
> > Aastha.
> >
> >
> >
> >
> >
> > --
> > Aastha Mehta
> > B.E. (Hons.) Computer Science
> > BITS Pilani
> > E-mail: aasthakm@gmail.com
>



-- 
Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasthakm@gmail.com

Re: Compiling hadoop native libraries

Posted by Arun Murthy <ac...@hortonworks.com>.
<target> could be compile or binary or tar.

Arun

Sent from my iPhone

On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aa...@gmail.com> wrote:

> Hello,
>
> I am trying to run fuse_dfs_wrapper.sh from
> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
> following error:
> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open
> shared object file: No such file or directory
>
> I searched on the net and found a response to a similar query here -
> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
>
> My hadoop package contains the native files in
> hadoop-0.20.2/lib/native/Linux-amd64-64/
>
> I followed to this link -
> http://hadoop.apache.org/common/docs/current/native_libraries.html to
> understand the steps to build hadoop native libraries.
>
> I have a small query regarding the building step. On the above link, it is
> mentioned -
>
> "Once you installed the prerequisite packages use the standard hadoop
> build.xml file and pass along the compile.native flag (set to true) to build
> the native hadoop library:
>
> $ ant -Dcompile.native=true <target>
>
> You should see the newly-built library in:
>
> $ build/native/<platform>/lib
>
> where <platform> is a combination of the system-properties: ${os.name
> }-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
>
>
> Could someone please tell what exactly is <target> in the first step.
>
>
> Thanks and regards,
>
> Aastha.
>
>
>
>
>
> --
> Aastha Mehta
> B.E. (Hons.) Computer Science
> BITS Pilani
> E-mail: aasthakm@gmail.com