Posted to common-user@hadoop.apache.org by Adarsh Sharma <ad...@orkash.com> on 2011/04/28 06:31:11 UTC

Running C hdfs Code in Hadoop

Dear all,

Today I am trying to run a simple libhdfs program by following the tutorial below:


http://hadoop.apache.org/hdfs/docs/current/libhdfs.html

I followed these steps:

1. Set LD_LIBRARY_PATH & CLASSPATH:

export LD_LIBRARY_PATH=/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/lib:/usr/java/jdk1.6.0_18/jre/lib/amd64/libjava.so
export CLASSPATH=$CLASSPATH:$HADOOP_HOME:$HADOOP_HOME/lib:/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/lib:/usr/java/jdk1.6.0_18/jre/lib/amd64
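[Editor's note: for comparison, a setup along these lines is what libhdfs expects. LD_LIBRARY_PATH entries should be directories, not individual .so files, and because libhdfs embeds a JVM, the Hadoop jars themselves must appear on CLASSPATH. The paths are illustrative, mirroring the layout in this post:]

```shell
# Assumed install locations (adjust to your machine):
export HADOOP_HOME=${HADOOP_HOME:-/home/hadoop/project/hadoop-0.20.2}
export JAVA_HOME=${JAVA_HOME:-/usr/java/jdk1.6.0_18}

# Directories only -- never individual .so files:
export LD_LIBRARY_PATH=$HADOOP_HOME/c++/Linux-amd64-64/lib:$JAVA_HOME/jre/lib/amd64:$JAVA_HOME/jre/lib/amd64/server

# libhdfs starts a JVM, so every Hadoop jar (not just the directories
# containing them) must be on CLASSPATH:
CLASSPATH=$HADOOP_HOME/conf
for jar in "$HADOOP_HOME"/*.jar "$HADOOP_HOME"/lib/*.jar; do
  CLASSPATH=$CLASSPATH:$jar
done
export CLASSPATH
```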

2. Write the above_sample.c program and put it into the 
$HADOOP_HOME/src/c++/libhdfs directory.
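[Editor's note: the post does not include above_sample.c itself. A minimal write-to-HDFS sketch in the spirit of the libhdfs tutorial's example would look like the following; it assumes a reachable cluster whose configuration is picked up from the jars on CLASSPATH, and the file path is illustrative:]

```c
/* above_sample.c -- minimal libhdfs sketch (not the poster's exact code).
 * "default" makes hdfsConnect read the cluster address from the Hadoop
 * configuration found on CLASSPATH. */
#include "hdfs.h"
#include <fcntl.h>   /* O_WRONLY, O_CREAT */
#include <stdio.h>
#include <string.h>

int main(void) {
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "failed to connect to HDFS\n");
        return 1;
    }

    const char *path = "/tmp/above_sample.txt";  /* illustrative path */
    const char *msg  = "Hello from libhdfs!\n";

    /* bufferSize, replication, blocksize of 0 mean "use defaults" */
    hdfsFile f = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!f) {
        fprintf(stderr, "failed to open %s for writing\n", path);
        return 1;
    }

    tSize n = hdfsWrite(fs, f, (void *)msg, strlen(msg));
    if (hdfsFlush(fs, f)) {
        fprintf(stderr, "failed to flush %s\n", path);
        return 1;
    }
    hdfsCloseFile(fs, f);
    printf("wrote %d bytes to %s\n", (int)n, path);

    hdfsDisconnect(fs);
    return 0;
}
```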

3. After compiling with the command below, I see the following errors:


gcc above_sample.c \
    -I/home/hadoop/project/hadoop-0.20.2/src/c++/libhdfs \
    -L/home/hadoop/project/hadoop-0.20.2/src/c++/libhdfs \
    -L/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/lib \
    -L$HADOOP_HOME/c++/Linux-amd64-64/lib/libhdfs.so.0 -lhdfs \
    -I$HADOOP_HOME -I$HADOOP_HOME/lib \
    /usr/java/jdk1.6.0_18/jre/lib/amd64/server/libjvm.so \
    /home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/lib/libhdfs.so.0 \
    -o above_sample

bash-3.2$ ./above_sample
Error occurred during initialization of VM
Unable to load native library: /home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/libjava.so: cannot open shared object file: No such file or directory


Now when I try the following command:

gcc above_sample.c \
    -I/home/hadoop/project/hadoop-0.20.2/src/c++/libhdfs \
    -L/home/hadoop/project/hadoop-0.20.2/src/c++/libhdfs \
    -L/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/lib \
    -L$HADOOP_HOME/c++/Linux-amd64-64/lib/libhdfs.so.0 -lhdfs \
    -I$HADOOP_HOME -I$HADOOP_HOME/lib \
    /usr/java/jdk1.6.0_18/jre/lib/amd64/server/libjvm.so \
    /home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/lib/libhdfs.so.0 \
    /usr/java/jdk1.6.0_18/jre/lib/amd64/libjava.so \
    -o above_sample
/usr/bin/ld: warning: libverify.so, needed by /usr/java/jdk1.6.0_18/jre/lib/amd64/libjava.so, not found (try using -rpath or -rpath-link)
/usr/java/jdk1.6.0_18/jre/lib/amd64/libjava.so: undefined reference to `VerifyClassname@SUNWprivate_1.1'
/usr/java/jdk1.6.0_18/jre/lib/amd64/libjava.so: undefined reference to `VerifyClassForMajorVersion@SUNWprivate_1.1'
/usr/java/jdk1.6.0_18/jre/lib/amd64/libjava.so: undefined reference to `VerifyFixClassname@SUNWprivate_1.1'
/usr/java/jdk1.6.0_18/jre/lib/amd64/libjava.so: undefined reference to `VerifyClass@SUNWprivate_1.1'
collect2: ld returned 1 exit status
bash-3.2$

Can someone guide me through the steps needed to run a simple libhdfs 
program on a Hadoop cluster?

Thanks & best regards,
Adarsh Sharma


Re: Running C hdfs Code in Hadoop

Posted by Adarsh Sharma <ad...@orkash.com>.
Thanks Brian, it works.
Looking forward to your guidance on future problems.

Thanks Once again

Brian Bockelman wrote:
> Hi Adarsh,
>
> It appears you don't have the JVM libraries in your LD_LIBRARY_PATH.  Try this:
>
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64:$JAVA_HOME/jre/lib/amd64/server
>
> Brian
>


Re: Running C hdfs Code in Hadoop

Posted by Brian Bockelman <bb...@cse.unl.edu>.
Hi Adarsh,

It appears you don't have the JVM libraries in your LD_LIBRARY_PATH.  Try this:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64:$JAVA_HOME/jre/lib/amd64/server

Brian
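[Editor's note: putting Brian's fix together, a sketch of the full build-and-run sequence. Paths are illustrative, taken from the thread. Linking against -ljvm, rather than naming libjava.so on the gcc command line as in the second attempt, avoids the libverify/SUNWprivate undefined references, since libjava.so is loaded by the JVM itself at run time:]

```shell
export HADOOP_HOME=${HADOOP_HOME:-/home/hadoop/project/hadoop-0.20.2}
export JAVA_HOME=${JAVA_HOME:-/usr/java/jdk1.6.0_18}

# Brian's fix: make the JVM's own libraries resolvable at run time.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64:$JAVA_HOME/jre/lib/amd64/server:$HADOOP_HOME/c++/Linux-amd64-64/lib

# Link against libhdfs and libjvm; never pass libjava.so to the linker
# directly (that is what dragged in the unresolved libverify symbols).
gcc above_sample.c \
    -I"$HADOOP_HOME/src/c++/libhdfs" \
    -L"$HADOOP_HOME/c++/Linux-amd64-64/lib" -lhdfs \
    -L"$JAVA_HOME/jre/lib/amd64/server" -ljvm \
    -o above_sample

./above_sample
```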
