Posted to hdfs-user@hadoop.apache.org by shubhangi <sh...@oracle.com> on 2013/03/06 10:50:13 UTC

For Hadoop 2.0.3; setting CLASSPATH=$(hadoop classpath) does not work, as opposed to 1.x versions

I am writing an application in C++ that uses the API provided by libhdfs
to manipulate the Hadoop DFS.
I could run the application with 1.0.4 and 1.1.1, setting the classpath
equal to
$(hadoop classpath).

For Hadoop 2.0.3, setting CLASSPATH=$(hadoop classpath) does not load the
classes required for libhdfs, as it did for the 1.x versions; it gives
the following error:

loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: 
ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, 
kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: 
ExceptionUtils::getStackTrace error.)

I tried loading the jar files with their full paths specified (instead of
the wildcard entries used in the classpath), and the application runs,
but gives the following warning:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for 
further details.
13/03/04 11:17:23 WARN util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes 
where applicable
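A likely cause is that `hadoop classpath` in the 2.x releases emits wildcard entries (e.g. `share/hadoop/common/*`), which the JVM that libhdfs creates through JNI does not expand the way the `hadoop` launcher script does. One workaround is to expand the wildcards into explicit jar paths before exporting CLASSPATH. A minimal sketch, assuming a bash environment (the `expand_classpath` helper name is made up for illustration, and the paths are illustrative):

```shell
#!/usr/bin/env bash
# Expand wildcard entries (dir/*) of a colon-separated classpath into
# explicit .jar paths, since the JNI-launched JVM used by libhdfs does
# not expand them itself.
expand_classpath() {
  local entry jar expanded=""
  local -a entries
  IFS=':' read -ra entries <<< "$1"
  for entry in "${entries[@]}"; do
    if [[ "$entry" == *'*' ]]; then
      # dir/* -> every dir/*.jar that actually exists
      for jar in "${entry%\*}"*.jar; do
        [ -e "$jar" ] && expanded="$expanded:$jar"
      done
    else
      expanded="$expanded:$entry"
    fi
  done
  printf '%s\n' "${expanded#:}"
}

# Usage before launching the libhdfs application:
#   export CLASSPATH="$(expand_classpath "$(hadoop classpath)")"
```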


Re: For Hadoop 2.0.3; setting CLASSPATH=$(hadoop classpath) does not work, as opposed to 1.x versions

Posted by Arpit Gupta <ar...@hortonworks.com>.
When you constructed the classpath with the full paths, did you also add slf4j-log4j12-*.jar (http://www.slf4j.org/codes.html#StaticLoggerBinder) to the classpath? The jar should be in HADOOP_HOME/lib. That should fix the SLF4J issue.

> 13/03/04 11:17:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

How did you install Hadoop: did you use tarballs or RPMs? Do you have .so files in HADOOP_HOME/lib/native?

If you used tarballs, you might have to rebuild the native code for your OS:

https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.0.3-alpha/BUILDING.txt
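The lib/native question above can be answered with a quick look for the shared objects themselves. A sketch, assuming a bash environment (the `check_native` helper name is made up here; `libhadoop.so` is the usual name of the native-hadoop library):

```shell
#!/usr/bin/env bash
# Report whether the native-hadoop shared library that the
# NativeCodeLoader warning refers to is present under a Hadoop install.
check_native() {
  local home="$1"
  if ls "$home"/lib/native/libhadoop.so* >/dev/null 2>&1; then
    echo "native library present"
  else
    echo "native library missing"
  fi
}

# Usage: check_native "$HADOOP_HOME"
```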

--
Arpit Gupta
Hortonworks Inc.
http://hortonworks.com/

On Mar 6, 2013, at 1:50 AM, shubhangi <sh...@oracle.com> wrote:

> I am writing an application in c++, which uses API provided by libhdfs to manipulate Hadoop DFS.
> I could run the application with 1.0.4 and 1.1.1; setting classpath equal to
> $(hadoop classpath).
> 
> For Hadoop 2.0.3; setting CLASSPATH=$(hadoop classpath) does not load necessary classes required for libhdfs; as opposed to 1.x versions; giving the following error:
> 
> loadFileSystems error:
> (unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
> hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
> (unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
> 
> I tried loading the jar files with their full path specified (as opposed to wildcard characters used in the classpath); and the application runs, but gives the following warning:
> 
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> 13/03/04 11:17:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 


Re: For Hadoop 2.0.3; setting CLASSPATH=$(hadoop classpath) does not work, as opposed to 1.x versions

Posted by Shumin Guo <gs...@gmail.com>.
You can always print out the hadoop classpath before running the hadoop
command, for example by editing the $HADOOP_HOME/bin/hadoop file.

HTH.
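An alternative to editing bin/hadoop is a tiny helper that prints each classpath entry on its own line, which makes any unexpanded wildcard entries easy to spot (the `print_classpath` name is illustrative, not part of Hadoop):

```shell
#!/usr/bin/env bash
# Print each entry of a colon-separated classpath on its own line.
print_classpath() {
  tr ':' '\n' <<< "$1"
}

# Usage: print_classpath "$(hadoop classpath)"
```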

On Wed, Mar 6, 2013 at 5:01 AM, shubhangi <sh...@oracle.com> wrote:

> Hi All,
>
>
>  I am writing an application in c++, which uses API provided by libhdfs to
>> manipulate Hadoop DFS.
>> I could run the application with 1.0.4 and 1.1.1; setting classpath equal
>> to
>> $(hadoop classpath).
>>
>> For Hadoop 2.0.3; setting CLASSPATH=$(hadoop classpath) does not load
>> necessary classes required for libhdfs; as opposed to 1.x versions; giving
>> the following error:
>>
>> loadFileSystems error:
>> (unable to get stack trace for java.lang.NoClassDefFoundError exception:
>> ExceptionUtils::getStackTrace error.)
>> hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0,
>> kerbTicketCachePath=(NULL), userName=(NULL)) error:
>> (unable to get stack trace for java.lang.NoClassDefFoundError exception:
>> ExceptionUtils::getStackTrace error.)
>>
>> I tried loading the jar files with their full path specified (as opposed
>> to wildcard characters used in the classpath); and the application runs,
>> but gives the following warning:
>>
>> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
>> SLF4J: Defaulting to no-operation (NOP) logger implementation
>> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
>> 13/03/04 11:17:23 WARN util.NativeCodeLoader: Unable to load
>> native-hadoop library for your platform... using builtin-java classes where
>> applicable
>>
> The environment is:
>
> Environment: Ubuntu 12.04 32 bit, java version "1.7.0_03"
> Hadoop release: 2.0.3-alpha
>
> Any help would be appreciated.
>
> Thank you in advance,
> Shubhangi
>

Re: For Hadoop 2.0.3; setting CLASSPATH=$(hadoop classpath) does not work, as opposed to 1.x versions

Posted by shubhangi <sh...@oracle.com>.
Hi All,

> I am writing an application in c++, which uses API provided by libhdfs 
> to manipulate Hadoop DFS.
> I could run the application with 1.0.4 and 1.1.1; setting classpath 
> equal to
> $(hadoop classpath).
>
> For Hadoop 2.0.3; setting CLASSPATH=$(hadoop classpath) does not load 
> necessary classes required for libhdfs; as opposed to 1.x versions; 
> giving the following error:
>
> loadFileSystems error:
> (unable to get stack trace for java.lang.NoClassDefFoundError 
> exception: ExceptionUtils::getStackTrace error.)
> hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, 
> kerbTicketCachePath=(NULL), userName=(NULL)) error:
> (unable to get stack trace for java.lang.NoClassDefFoundError 
> exception: ExceptionUtils::getStackTrace error.)
>
> I tried loading the jar files with their full path specified (as 
> opposed to wildcard characters used in the classpath); and the 
> application runs, but gives the following warning:
>
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for 
> further details.
> 13/03/04 11:17:23 WARN util.NativeCodeLoader: Unable to load 
> native-hadoop library for your platform... using builtin-java classes 
> where applicable
>
The environment is:

Environment: Ubuntu 12.04 32 bit, java version "1.7.0_03"
Hadoop release: 2.0.3-alpha

Any help would be appreciated.

Thank you in advance,
Shubhangi
