Posted to common-user@hadoop.apache.org by Raghavendra K <ra...@gmail.com> on 2008/02/21 12:29:52 UTC
Problem with LibHDFS
Hi,
I am able to get Hadoop running and have also compiled libhdfs.
But when I run the hdfs_test program, it gives a segmentation fault.
Just a small program like this
#include "hdfs.h"
int main() {
return(0);
}
and compiled using the command
gcc -ggdb -m32 -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include
-I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/ hdfs_test.c
-L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
-L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server -ljvm
-shared -m32 -Wl,-x -o hdfs_test
Running hdfs_test gives a segmentation fault.
Please tell me how to fix it.
--
Regards,
Raghavendra K
Re: Problem with LibHDFS
Posted by Raghavendra K <ra...@gmail.com>.
Hi,
Thanks a lot for your reply.
I have added hadoop-core.jar and the conf/ directory to the CLASSPATH, but I am
still getting the same error:
test-libhdfs.sh: line 83: 8396 Segmentation fault (core dumped)
CLASSPATH=$HADOOP_HOME/conf:$CLASSPATH
LD_PRELOAD="$HADOOP_HOME/libhdfs/libhdfs.so" $LIBHDFS_BUILD_DIR/$HDFS_TEST
What should I do?
Is there any way out?
On Fri, Feb 22, 2008 at 11:06 PM, Arun C Murthy <ac...@yahoo-inc.com> wrote:
>
> On Feb 21, 2008, at 3:29 AM, Raghavendra K wrote:
>
> > Hi,
> > I am able to get Hadoop running and also able to compile the
> > libhdfs.
> > But when I run the hdfs_test program it is giving Segmentation Fault.
>
> Unfortunately the documentation for using libhdfs is sparse, our
> apologies.
>
> You'll need to set the CLASSPATH to include your hadoop-core.jar and
> the conf/ directory to run libhdfs since it is a JNI wrapper over the
> HDFS java api.
>
> Please take a look at the 'test-libhdfs' target in the top-level
> build.xml for hints on how to set them...
>
> Arun
>
> > Just a small program like this
> > #include "hdfs.h"
> > int main() {
> > return(0);
> > }
> > and compiled using the command
> > gcc -ggdb -m32 -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/
> > include
> > -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/
> > hdfs_test.c
> > -L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
> > -L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server
> > -ljvm
> > -shared -m32 -Wl,-x -o hdfs_test
> > running hdfs_test gives segmentation fault.
> > please tell me as to how to fix it.
> >
> >
> >
> > --
> > Regards,
> > Raghavendra K
>
>
--
Regards,
Raghavendra K
Re: Problem with LibHDFS
Posted by Arun C Murthy <ac...@yahoo-inc.com>.
On Feb 21, 2008, at 3:29 AM, Raghavendra K wrote:
> Hi,
> I am able to get Hadoop running and also able to compile the
> libhdfs.
> But when I run the hdfs_test program it is giving Segmentation Fault.
Unfortunately the documentation for using libhdfs is sparse, our
apologies.
You'll need to set the CLASSPATH to include your hadoop-core.jar and
the conf/ directory to run libhdfs since it is a JNI wrapper over the
HDFS java api.
Please take a look at the 'test-libhdfs' target in the top-level
build.xml for hints on how to set them...
Arun
> Just a small program like this
> #include "hdfs.h"
> int main() {
> return(0);
> }
> and compiled using the command
> gcc -ggdb -m32 -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/
> include
> -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/
> hdfs_test.c
> -L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
> -L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server
> -ljvm
> -shared -m32 -Wl,-x -o hdfs_test
> running hdfs_test gives segmentation fault.
> please tell me as to how to fix it.
>
>
>
> --
> Regards,
> Raghavendra K
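[The CLASSPATH setup Arun describes can be sketched as a shell fragment. The loop over lib/*.jar and the use of HADOOP_HOME are assumptions based on a typical Hadoop 0.15 layout, not taken verbatim from the thread; the default path below is the poster's.]

```shell
# Build a CLASSPATH covering hadoop-core.jar, its dependencies, and conf/,
# so the JVM that libhdfs embeds can find the HDFS classes.
HADOOP_HOME=${HADOOP_HOME:-/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3}
CLASSPATH=$HADOOP_HOME/conf
for jar in "$HADOOP_HOME"/*.jar "$HADOOP_HOME"/lib/*.jar; do
  CLASSPATH=$CLASSPATH:$jar
done
export CLASSPATH
echo "$CLASSPATH"
```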
Re: Problem with LibHDFS
Posted by Raghavendra K <ra...@gmail.com>.
I tried even that, and the output is
Program received signal SIGSEGV, Segmentation fault.
0x00000001 in ?? ()
(gdb) bt
#0 0x00000001 in ?? ()
(gdb)
It's the same thing... I don't know what to do.
On Thu, Feb 21, 2008 at 10:11 PM, Jaideep Dhok <ja...@gmail.com>
wrote:
> Type 'bt' on the gdb prompt after you get the segfault. It will direct you
> to the line where segfault occurred.
> - Jaideep
> On Thu, Feb 21, 2008 at 9:52 PM, Raghavendra K <ra...@gmail.com>
> wrote:
>
> > When I try with gdb, i receive the following output
> > (gdb) r
> > Starting program:
> > /garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3
> > /src/c++/libhdfs/hdfs_test
> >
> >
> > Program received signal SIGSEGV, Segmentation fault.
> > 0x00000001 in ?? ()
> >
> > Dont know what to make out of it.
> >
> > On Thu, Feb 21, 2008 at 5:06 PM, Miles Osborne <mi...@inf.ed.ac.uk>
> wrote:
> >
> > > Since you are compiling a C(++) program, why not add the -g switch and
> > run
> > > it within gdb: that will tell people which line it crashes at (etc
> etc)
> > >
> > > Miles
> > >
> > > On 21/02/2008, Raghavendra K <ra...@gmail.com> wrote:
> > > >
> > > > Hi,
> > > > I am able to get Hadoop running and also able to compile the
> > libhdfs.
> > > > But when I run the hdfs_test program it is giving Segmentation
> Fault.
> > > > Just a small program like this
> > > > #include "hdfs.h"
> > > > int main() {
> > > > return(0);
> > > > }
> > > > and compiled using the command
> > > > gcc -ggdb -m32
> > > -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include
> > > > -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/
> > hdfs_test.c
> > > > -L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
> > > > -L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server
> > > -ljvm
> > > > -shared -m32 -Wl,-x -o hdfs_test
> > > > running hdfs_test gives segmentation fault.
> > > > please tell me as to how to fix it.
> > > >
> > > >
> > > >
> > > > --
> > > > Regards,
> > > >
> > > > Raghavendra K
> > > >
> > >
> > >
> > >
> > > --
> > > The University of Edinburgh is a charitable body, registered in
> > Scotland,
> > > with registration number SC005336.
> > >
> >
> >
> >
> > --
> > Regards,
> > Raghavendra K
> >
>
>
>
> --
> Jaideep Dhok
>
--
Regards,
Raghavendra K
Re: Problem with LibHDFS
Posted by Jaideep Dhok <ja...@gmail.com>.
Type 'bt' at the gdb prompt after you get the segfault. It will point you
to the line where the segfault occurred.
- Jaideep
On Thu, Feb 21, 2008 at 9:52 PM, Raghavendra K <ra...@gmail.com>
wrote:
> When I try with gdb, i receive the following output
> (gdb) r
> Starting program:
> /garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3
> /src/c++/libhdfs/hdfs_test
>
>
> Program received signal SIGSEGV, Segmentation fault.
> 0x00000001 in ?? ()
>
> Dont know what to make out of it.
>
> On Thu, Feb 21, 2008 at 5:06 PM, Miles Osborne <mi...@inf.ed.ac.uk> wrote:
>
> > Since you are compiling a C(++) program, why not add the -g switch and
> run
> > it within gdb: that will tell people which line it crashes at (etc etc)
> >
> > Miles
> >
> > On 21/02/2008, Raghavendra K <ra...@gmail.com> wrote:
> > >
> > > Hi,
> > > I am able to get Hadoop running and also able to compile the
> libhdfs.
> > > But when I run the hdfs_test program it is giving Segmentation Fault.
> > > Just a small program like this
> > > #include "hdfs.h"
> > > int main() {
> > > return(0);
> > > }
> > > and compiled using the command
> > > gcc -ggdb -m32
> > -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include
> > > -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/
> hdfs_test.c
> > > -L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
> > > -L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server
> > -ljvm
> > > -shared -m32 -Wl,-x -o hdfs_test
> > > running hdfs_test gives segmentation fault.
> > > please tell me as to how to fix it.
> > >
> > >
> > >
> > > --
> > > Regards,
> > >
> > > Raghavendra K
> > >
> >
> >
> >
> > --
> > The University of Edinburgh is a charitable body, registered in
> Scotland,
> > with registration number SC005336.
> >
>
>
>
> --
> Regards,
> Raghavendra K
>
--
Jaideep Dhok
Re: Problem with LibHDFS
Posted by Raghavendra K <ra...@gmail.com>.
When I try with gdb, I receive the following output
(gdb) r
Starting program:
/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/src/c++/libhdfs/hdfs_test
Program received signal SIGSEGV, Segmentation fault.
0x00000001 in ?? ()
Don't know what to make of it.
On Thu, Feb 21, 2008 at 5:06 PM, Miles Osborne <mi...@inf.ed.ac.uk> wrote:
> Since you are compiling a C(++) program, why not add the -g switch and run
> it within gdb: that will tell people which line it crashes at (etc etc)
>
> Miles
>
> On 21/02/2008, Raghavendra K <ra...@gmail.com> wrote:
> >
> > Hi,
> > I am able to get Hadoop running and also able to compile the libhdfs.
> > But when I run the hdfs_test program it is giving Segmentation Fault.
> > Just a small program like this
> > #include "hdfs.h"
> > int main() {
> > return(0);
> > }
> > and compiled using the command
> > gcc -ggdb -m32
> -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include
> > -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/ hdfs_test.c
> > -L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
> > -L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server
> -ljvm
> > -shared -m32 -Wl,-x -o hdfs_test
> > running hdfs_test gives segmentation fault.
> > please tell me as to how to fix it.
> >
> >
> >
> > --
> > Regards,
> >
> > Raghavendra K
> >
>
>
>
> --
> The University of Edinburgh is a charitable body, registered in Scotland,
> with registration number SC005336.
>
--
Regards,
Raghavendra K
Re: Problem with LibHDFS
Posted by Miles Osborne <mi...@inf.ed.ac.uk>.
Since you are compiling a C(++) program, why not add the -g switch and run
it under gdb? That will show which line it crashes at.
Miles
On 21/02/2008, Raghavendra K <ra...@gmail.com> wrote:
>
> Hi,
> I am able to get Hadoop running and also able to compile the libhdfs.
> But when I run the hdfs_test program it is giving Segmentation Fault.
> Just a small program like this
> #include "hdfs.h"
> int main() {
> return(0);
> }
> and compiled using the command
> gcc -ggdb -m32 -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include
> -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/ hdfs_test.c
> -L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
> -L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server -ljvm
> -shared -m32 -Wl,-x -o hdfs_test
> running hdfs_test gives segmentation fault.
> please tell me as to how to fix it.
>
>
>
> --
> Regards,
>
> Raghavendra K
>
--
The University of Edinburgh is a charitable body, registered in Scotland,
with registration number SC005336.
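[Miles's suggestion can also be run non-interactively: gdb's batch mode runs the program and prints the backtrace in one shot. crashy.c below is a made-up example that crashes on purpose, not the poster's code.]

```shell
# Compile a deliberately crashing program with debug info, then let gdb
# run it and print a backtrace without an interactive session.
cat > crashy.c <<'EOF'
int main(void) {
    int *p = 0;
    return *p;   /* null dereference: guaranteed SIGSEGV */
}
EOF
gcc -g crashy.c -o crashy
gdb --batch -ex run -ex bt ./crashy   # backtrace names crashy.c and the line
```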