Posted to common-user@hadoop.apache.org by Arifa Nisar <a-...@u.northwestern.edu> on 2009/01/17 10:19:27 UTC

Problem running hdfs_test

Hello all,

 

I am trying to test hdfs_test.c, which is provided with the Hadoop installation.
libhdfs.so and hdfs_test build fine after I make a few changes in
$(HADOOP_HOME)/src/c++/libhdfs/Makefile. But when I try to run ./hdfs_test,
I get a segmentation fault at 0x0000000000000001:

 

Program received signal SIGSEGV, Segmentation fault.

0x0000000000000001 in ?? ()

(gdb) bt

#0  0x0000000000000001 in ?? ()

#1  0x00007fffd0c51af5 in ?? ()

#2  0x0000000000000000 in ?? ()

 

A simple hello world program linked with libhdfs.so also gives the same
error. CLASSPATH includes all the jar files in $(HADOOP_HOME),
$(HADOOP_HOME)/conf, $(HADOOP_HOME)/lib, and $(JAVA_HOME)/lib.
Please help.
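
For reference, the CLASSPATH is assembled roughly like this (an illustrative shell sketch only; the exact jar names depend on the Hadoop version):

# sketch: conf directory plus every jar in the directories listed above
CLASSPATH=${HADOOP_HOME}/conf
for jar in ${HADOOP_HOME}/*.jar ${HADOOP_HOME}/lib/*.jar ${JAVA_HOME}/lib/*.jar; do
  CLASSPATH=${CLASSPATH}:${jar}
done
export CLASSPATH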

 

Thanks,

Arifa.

 


RE: Problem running hdfs_test

Posted by Arifa Nisar <a-...@u.northwestern.edu>.
Thanks a lot for your help. I solved the problem by removing LDFLAGS (which contained libjvm.so) from the hdfs_test compilation. I had added that flag to get the Makefile build to compile, but it turned out to be the real problem. Only after removing it was I able to run the test with ant.
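
As far as I understand it, the difference comes down to the following two link lines (an illustrative sketch with made-up paths, not my exact flags):

# Before (crashing): libjvm passed explicitly at link time via LDFLAGS
gcc hdfs_test.c -I${HADOOP_HOME}/src/c++/libhdfs -L${HADOOP_HOME}/src/c++/libhdfs -lhdfs \
    -L${JAVA_HOME}/jre/lib/amd64/server -ljvm -o hdfs_test

# After (working): link only against libhdfs and let LD_LIBRARY_PATH locate libjvm.so at run time
gcc hdfs_test.c -I${HADOOP_HOME}/src/c++/libhdfs -L${HADOOP_HOME}/src/c++/libhdfs -lhdfs -o hdfs_test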

Thanks,
Arifa



Re: Problem running hdfs_test

Posted by Rasit OZDAS <ra...@gmail.com>.
Hi Arifa,

I had to add the LD_LIBRARY_PATH environment variable to get my example running
correctly. I have no idea whether it will help, since my error wasn't a
segmentation fault, but I would try it anyway.

LD_LIBRARY_PATH=/usr/JRE/jre1.6.0_11/jre1.6.0_11/lib:/usr/JRE/jre1.6.0_11/jre1.6.0_11/lib/amd64/server

(That is, the server directory of the JRE, which contains the libjvm.so file,
plus the lib directory of the same JRE.)
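
In other words, something along these lines before running the test (the JRE location below is from my machine; substitute your own):

export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/usr/JRE/jre1.6.0_11/jre1.6.0_11/lib:/usr/JRE/jre1.6.0_11/jre1.6.0_11/lib/amd64/server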

Hope this helps,
Rasit



-- 
M. Raşit ÖZDAŞ

RE: Problem running hdfs_test

Posted by Arifa Nisar <a-...@u.northwestern.edu>.
Hello,

As I mentioned in my previous email, I am getting a segmentation fault at
0x0000000000000001 while running hdfs_test. It was suggested that I build and run
hdfs_test using ant, since ant should set some environment variables that the
Makefile won't. I tried building libhdfs and running hdfs_test with ant, but
I am still having the same problem. Now, instead of hdfs_test, I am testing a
simpler program with libhdfs. I linked the following hello world program
against libhdfs:

#include "hdfs.h"
int main() {
  printf("Hello World.\n");
  return(0);
}
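
The Makefile line I mention below boils down to a compile/link command roughly like this (an illustrative gcc invocation, not the exact rule I added):

gcc test.c -o test \
    -I${HADOOP_HOME}/src/c++/libhdfs \
    -L${LIBHDFS_BUILD_DIR} -lhdfs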

I added a line to compile this test program in
${HADOOP_HOME}/src/c++/libhdfs/Makefile and replaced hdfs_test with this
test program in ${HADOOP_HOME}/src/c++/libhdfs/tests/test-libhdfs.sh. I built
and invoked the test using the test-libhdfs target in build.xml, but I still
get a segmentation fault when this simple test program is invoked from
test-libhdfs.sh. I followed these steps:

cd ${HADOOP_HOME}
ant clean
cd ${HADOOP_HOME}/src/c++/libhdfs/
rm -f hdfs_test hdfs_write hdfs_read libhdfs.so* *.o test
cd ${HADOOP_HOME}
ant test-libhdfs -Dlibhdfs=1

Error Line
--------------
[exec] ./tests/test-libhdfs.sh: line 85: 23019 Segmentation fault
$LIBHDFS_BUILD_DIR/$HDFS_TEST

I have attached the output of this command to this email. I added
"env" to test-libhdfs.sh to see which environment variables are set. Please
point out if any variable is set incorrectly. Any suggestion would be
helpful, as I have already spent a lot of time on this problem.

I have added the following lines to the Makefile and test-libhdfs.sh:

Makefile
-------------
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-icedtea-1.7.0.0.x86_64
export OS_ARCH=amd64
export OS_NAME=Linux
export LIBHDFS_BUILD_DIR=$(HADOOP_HOME)/src/c++/libhdfs
export SHLIB_VERSION=1

test-libhdfs.sh
------------------

HADOOP_CONF_DIR=${HADOOP_HOME}/conf
HADOOP_LOG_DIR=${HADOOP_HOME}/logs
LIBHDFS_BUILD_DIR=${HADOOP_HOME}/src/c++/libhdfs
HDFS_TEST=test

When I don't link test.c against libhdfs, it runs without error and prints
"Hello World" when "ant test-libhdfs -Dlibhdfs=1" is run. I made sure that
"ant" and "hadoop" use the same Java installation, and I have also tried this
on a 32-bit machine, but I still get the segmentation fault. At this point I am
clueless about what I can do to correct this. Please help.
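
A quick, purely illustrative way to check which JVM each tool picks up:

which java && java -version
echo ${JAVA_HOME}
grep JAVA_HOME ${HADOOP_HOME}/conf/hadoop-env.sh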

Thanks,
Arifa.

PS: Also, please let me know whether there is a Java version of hdfs_test.


Re: Problem running hdfs_test

Posted by Delip Rao <de...@gmail.com>.
Try enabling the debug flags while compiling to get more information.
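
For example, something along these lines (illustrative gcc flags and gdb usage; adapt them to your Makefile):

# rebuild with debug symbols and no optimization, then run under gdb
gcc -g -O0 hdfs_test.c -I${HADOOP_HOME}/src/c++/libhdfs \
    -L${HADOOP_HOME}/src/c++/libhdfs -lhdfs -o hdfs_test
gdb ./hdfs_test
# inside gdb: "run", then after the crash "bt full" for a fuller backtrace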


Re: Problem running hdfs_test

Posted by Anum <mi...@gmail.com>.
Hi folks,

I have a somewhat similar problem, but I am running Hadoop with Java JDK 1.6 on
Fedora Linux. I have resolved much of it, but I still have a couple of questions
and would appreciate some answers:

1) During the compile, why does Ant require a -D parameter value? To my
knowledge this is a Java system property setting.
2) After building through Ant, the examples folder is empty. Why is that? It
might be related to the example above.

Any guidance would be appreciated.

Thanks.







-- 
View this message in context: http://www.nabble.com/Problem-running-unning-hdfs_test-tp21514502p21639312.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.