Posted to hdfs-user@hadoop.apache.org by Ahmed Eldawy <as...@gmail.com> on 2013/12/12 08:52:34 UTC

Error starting hadoop-2.2.0

Hi,
 I've been using Hadoop 1.x for a few months and it was working fine. Now
I want to migrate to hadoop-2.x, but I'm having trouble starting it. In
Hadoop 1.x, I used to configure core-site.xml and mapred-site.xml to be
able to start the master and slave on one machine.
In hadoop-2.2.0, I followed the instructions on
http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/SingleCluster.html
Whenever I start YARN or HDFS, I find this error in the logs:
java.lang.NoSuchMethodError: org.slf4j.helpers.MessageFormatter.format(Ljava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)Lorg/slf4j/helpers/FormattingTuple;
        at org.slf4j.impl.Log4jLoggerAdapter.info(Log4jLoggerAdapter.java:345)
        at org.mortbay.log.Slf4jLog.info(Slf4jLog.java:67)
        at org.mortbay.log.Log.<clinit>(Log.java:79)
        at org.mortbay.component.Container.add(Container.java:200)
        at org.mortbay.component.Container.update(Container.java:164)
        at org.mortbay.component.Container.update(Container.java:106)
        at org.mortbay.jetty.Server.setConnectors(Server.java:160)
        at org.mortbay.jetty.Server.addConnector(Server.java:134)
        at org.apache.hadoop.http.HttpServer.<init>(HttpServer.java:241)
        at org.apache.hadoop.http.HttpServer.<init>(HttpServer.java:174)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:305)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:664)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:259)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1727)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1642)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
2013-12-12 10:20:12,541 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1

Also, there is a warning that seems to be related:
SLF4J: The requested version 1.6.99 by your slf4j binding is not compatible with [1.5.5, 1.5.6]
SLF4J: See http://www.slf4j.org/codes.html#version_mismatch for further details.

Any suggestions on how to fix it?


Best regards,
Ahmed Eldawy

Re: Error starting hadoop-2.2.0

Posted by Hardik Pandya <sm...@gmail.com>.
Do you have multiple or mixed-version SLF4J jars in your classpath? How
about downgrading your SLF4J to 1.5.5 or 1.5.6?

Please let me know how it works out for you, thanks.

From the warning, the slf4j-api version does not match that of the
binding.
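
If it helps, here is a minimal diagnostic sketch (the class name is made
up, it is not part of Hadoop) that you can compile and run with the same
classpath the failing daemon uses. Assuming slf4j-api and a binding jar
are both on that classpath, it prints which jars actually supply them, so
a stray 1.5.x api jar shows up immediately:

    import org.slf4j.helpers.MessageFormatter;
    import org.slf4j.impl.StaticLoggerBinder;

    // Hypothetical diagnostic: run it with the same classpath as the DataNode.
    public class Slf4jWhich {
        public static void main(String[] args) {
            // Jar that actually provides the slf4j-api helper classes
            System.out.println("slf4j-api from: "
                + MessageFormatter.class.getProtectionDomain().getCodeSource().getLocation());
            // Jar that actually provides the binding (e.g. slf4j-log4j12)
            System.out.println("binding from:   "
                + StaticLoggerBinder.class.getProtectionDomain().getCodeSource().getLocation());
        }
    }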


An SLF4J binding designates an artifact such as *slf4j-jdk14.jar* or
*slf4j-log4j12.jar* used to *bind* slf4j to an underlying logging
framework, such as java.util.logging or log4j, respectively.

Mixing different versions of *slf4j-api.jar* and the SLF4J binding can
cause problems. For example, if you are using slf4j-api-1.7.5.jar, then you
should also use slf4j-simple-1.7.5.jar; using slf4j-simple-1.5.5.jar will
not work.
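
For reference, the method in your stack trace is exactly the one that
changed between the 1.5.x and 1.6.x API: in slf4j-api 1.5.x,
MessageFormatter.format(String, Object, Object) returns a String, while in
1.6.x and later it returns a FormattingTuple. A small sketch like the
following (the class name is hypothetical) compiles against a 1.6+/1.7+
slf4j-api, but if an older 1.5.x slf4j-api jar wins on the classpath at
runtime, it fails with the same NoSuchMethodError you are seeing:

    import org.slf4j.helpers.FormattingTuple;
    import org.slf4j.helpers.MessageFormatter;

    // Compiled against slf4j-api 1.6+, where format(...) returns FormattingTuple.
    // Run with an slf4j-api 1.5.x jar first on the classpath and the call below
    // throws java.lang.NoSuchMethodError, just like the binding code does inside
    // the DataNode's logging path.
    public class MismatchDemo {
        public static void main(String[] args) {
            FormattingTuple tuple = MessageFormatter.format("{} plus {}", 1, 2);
            System.out.println(tuple.getMessage());
        }
    }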

NOTE: From the client's perspective, all versions of slf4j-api are
compatible. Client code compiled with *slf4j-api-N.jar* will run perfectly
fine with *slf4j-api-M.jar* for any N and M. You only need to ensure that
the version of your binding matches that of the slf4j-api.jar. You do not
have to worry about the version of slf4j-api.jar used by a given dependency
in your project. You can always use any version of *slf4j-api.jar*, and as
long as the version of *slf4j-api.jar* and its binding match, you should be
fine.

At initialization time, if SLF4J suspects that there may be an API vs.
binding version mismatch, it will emit a warning about the suspected
mismatch.
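
Your warning is this check firing: at startup, slf4j-api compares the
version string requested by the binding against its own compatibility
list. Assuming a binding jar such as slf4j-log4j12 is on the classpath
(the field lives in the binding's StaticLoggerBinder), a throwaway class
like this (the name is made up) prints the requested version (1.6.99 in
your log), which a 1.5.x slf4j-api then rejects against [1.5.5, 1.5.6]:

    import org.slf4j.impl.StaticLoggerBinder;

    // Prints the API version the binding on the classpath asks for; an slf4j-api
    // 1.5.x jar compares it against [1.5.5, 1.5.6] and warns on a mismatch.
    public class RequestedVersion {
        public static void main(String[] args) {
            System.out.println(StaticLoggerBinder.REQUESTED_API_VERSION);
        }
    }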

