Posted to user@mahout.apache.org by sleefd <sl...@gmail.com> on 2014/10/27 15:47:08 UTC

Re: compatibility of hadoop and mahout version

You have to keep your installed Hadoop version the same as the version defined in pom.xml (or the Hadoop jars shipped under lib/hadoop). If they differ, you have to compile a new Mahout distribution against your Hadoop version. This can be done with a Maven command.
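For example, a rough sketch (the exact property or profile name depends on the pom.xml of the Mahout version you have, so check it there first):

    # from the top of the Mahout source checkout;
    # override the Hadoop version declared in pom.xml and rebuild
    mvn clean package -DskipTests -Dhadoop.version=1.2.1

Then use the jars produced by the build instead of the ones from the stock binary download.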


Sent from my Samsung mobile device

-------- Original message --------
From: jyotiranjan panda <te...@gmail.com>
Date: 2014-10-27 6:08 PM (GMT+08:00)
To: user@mahout.apache.org
Subject: Re: compatibility of hadoop and mahout version

Thanks Suneel,
I have now changed the Hadoop version from 2.3.0 to 1.2.1 and am getting a new error.
-----------------------------------------------------------------------------------------------------------------
SLF4J: slf4j-api 1.6.x (or later) is incompatible with this binding.
SLF4J: Your binding is version 1.5.5 or earlier.
SLF4J: Upgrade your binding to version 1.6.x.
Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.impl.StaticLoggerBinder.getSingleton()Lorg/slf4j/impl/StaticLoggerBinder;
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:128)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:107)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:295)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:269)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:281)
    at org.apache.mahout.common.AbstractJob.<clinit>(AbstractJob.java:90)
    at com.jyoti.mahout.HelloWorldClustering.main(HelloWorldClustering.java:104)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)

-----------------------------------------------------------------------------------------------------------------
When I googled the above error, I found suggestions to exclude SLF4J, so I
excluded that jar file while compiling in Eclipse, but I am still facing the
same problem.
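For reference, one way to see which SLF4J artifacts actually end up on the classpath (assuming the project is built with Maven rather than only in Eclipse) is:

    # list every org.slf4j artifact the build pulls in, including transitive ones
    mvn dependency:tree -Dincludes=org.slf4j

An SLF4J binding older than 1.6.x showing up there (for example an old slf4j-log4j12) would match the StaticLoggerBinder error above. The old binding also has to stay off the runtime classpath, not just be excluded at compile time in Eclipse; it may come from the Hadoop installation's own lib directory rather than the project itself.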



On Mon, Oct 27, 2014 at 11:07 AM, Suneel Marthi <sm...@apache.org> wrote:

> Mahout 0.9 is not compatible with Hadoop 2.x. Either you can work off the
> current git codebase on Hadoop 2.x, or try running Mahout 0.9 on Hadoop 1.2.1.
>
> On Mon, Oct 27, 2014 at 1:34 AM, jyotiranjan panda <te...@gmail.com>
> wrote:
>
> > Hi,
> > I just started learning Mahout last week.
> > I am facing lots of problems executing the sample examples in Mahout.
> > Before I ask about those errors, I want to confirm the version
> > compatibility of the daemons.
> >
> > I am using Apache Hadoop 2.3.0 with mahout-distribution-0.9 on an
> > Ubuntu 14.04 32-bit laptop.
> > While running Mahout on the command line I don't get any errors, and it
> > gives me all the valid program names as output. But while executing a
> > clustering example it gives the error below.
> >
> >
> > ---------------------------------------------------------------------------
> > hduser@localhostjp:/usr/local/hadoop$ hadoop jar mahouttest.jar
> > Exception in thread "main" java.lang.NoClassDefFoundError:
> > org/apache/mahout/math/Vector
> >     at java.lang.Class.forName0(Native Method)
> >     at java.lang.Class.forName(Class.java:270)
> >     at org.apache.hadoop.util.RunJar.main(RunJar.java:205)
> > Caused by: java.lang.ClassNotFoundException: org.apache.mahout.math.Vector
> >     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >     ... 3 more
> >
> >
> > -----------------------------------------------------------------------------------------------------------
> >
> > Regards
> > JyotiRanjan Panda
> >
>